The Johnson-Lindenstrauss Lemma
Author
Abstract
Definition 1.1. Let $N(0,1)$ denote the one-dimensional standard normal distribution, with density $n(x) = e^{-x^2/2}/\sqrt{2\pi}$. Let $N^d(0,1)$ denote the $d$-dimensional Gaussian distribution, induced by picking each coordinate independently from the standard normal distribution $N(0,1)$. Let $\mathrm{Exp}(\lambda)$ denote the exponential distribution with parameter $\lambda$, whose density function is $f(x) = \lambda e^{-\lambda x}$. Let $\Gamma_{\lambda,k}$ denote the gamma distribution with parameters $\lambda$ and $k$, whose density function is $g_{\lambda,k}(x) = \lambda \frac{(\lambda x)^{k-1}}{(k-1)!} e^{-\lambda x}$. The cumulative distribution function of $\Gamma_{\lambda,k}$ is $\Gamma_{\lambda,k}(x) = 1 - e^{-\lambda x}\left(1 + \frac{\lambda x}{1!} + \cdots + \frac{(\lambda x)^i}{i!} + \cdots + \frac{(\lambda x)^{k-1}}{(k-1)!}\right)$. As we prove below, the gamma distribution describes how long one has to wait until $k$ experiments succeed, where the duration of each experiment is distributed according to the exponential distribution. A random variable $X$ has the Poisson distribution with parameter $\eta > 0$ (a discrete distribution) if $\Pr[X = i] = \frac{\eta^i}{i!} e^{-\eta}$.
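The waiting-time interpretation of the gamma distribution can be checked empirically: summing $k$ independent $\mathrm{Exp}(\lambda)$ samples should produce draws from $\Gamma_{\lambda,k}$, whose CDF matches the closed form above. The sketch below (helper names `erlang_cdf` and `sum_of_exponentials` are my own, not from the text) compares the empirical CDF of such sums against the formula:

```python
import math
import random

def erlang_cdf(x, lam, k):
    """CDF of the gamma distribution with parameters lam and integer k,
    per the formula above: 1 - exp(-lam*x) * sum_{i=0}^{k-1} (lam*x)^i / i!."""
    s = sum((lam * x) ** i / math.factorial(i) for i in range(k))
    return 1.0 - math.exp(-lam * x) * s

def sum_of_exponentials(lam, k, rng):
    """Total waiting time until k experiments succeed, each Exp(lam)."""
    return sum(rng.expovariate(lam) for _ in range(k))

rng = random.Random(0)
lam, k, n = 2.0, 3, 100_000
samples = [sum_of_exponentials(lam, k, rng) for _ in range(n)]

x = 1.5
empirical = sum(s <= x for s in samples) / n
# The empirical fraction of sums below x should approximate the closed-form CDF.
print(empirical, erlang_cdf(x, lam, k))
```

The empirical mean of the sums should also be close to $k/\lambda$, the mean of $\Gamma_{\lambda,k}$.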
Similar references
236779: Foundations of Algorithms for Massive Datasets, Lecture 4: The Johnson-Lindenstrauss Lemma
The Johnson-Lindenstrauss lemma and its proof. This lecture aims to prove the Johnson–Lindenstrauss lemma. Since the lemma is proved easily with another interesting lemma, a part of this lecture is focused on the proof of this second lemma. At the end, the optimality of the Johnson–Lindenstrauss lemma is discussed. Lemma 1 (Johnson-Lindenstrauss). Given the initial space $X \subseteq \mathbb{R}^n$ s.t. $|X| = N$, <...
Johnson-Lindenstrauss Transformation and Random Projection
We give a brief survey of Johnson-Lindenstrauss lemma. CONTENTS
An Elementary Proof of the Johnson-Lindenstrauss Lemma
The Johnson-Lindenstrauss lemma shows that a set of $n$ points in high-dimensional Euclidean space can be mapped down into an $O(\log n / \varepsilon^2)$-dimensional Euclidean space such that the distance between any two points changes by only a factor of $(1 \pm \varepsilon)$. In this note, we prove this lemma using elementary probabilistic techniques.
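The distance-preserving map in these elementary proofs is typically a random Gaussian projection scaled by $1/\sqrt{d'}$, where $d'$ is the target dimension. A minimal sketch (the helper `jl_project` and the specific parameters are illustrative assumptions, not taken from the cited note):

```python
import math
import random

def jl_project(points, d_new, rng):
    """Map points in R^d down to R^d_new using a random Gaussian matrix,
    scaled by 1/sqrt(d_new) so squared distances are preserved in expectation."""
    d = len(points[0])
    R = [[rng.gauss(0.0, 1.0) for _ in range(d)] for _ in range(d_new)]
    scale = 1.0 / math.sqrt(d_new)
    return [[scale * sum(R[i][j] * p[j] for j in range(d)) for i in range(d_new)]
            for p in points]

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

rng = random.Random(1)
d, d_new, n = 500, 200, 10
points = [[rng.gauss(0.0, 1.0) for _ in range(d)] for _ in range(n)]
proj = jl_project(points, d_new, rng)

# Ratio of projected to original distance for every pair; with high
# probability all ratios lie within (1 - eps, 1 + eps) for modest eps.
ratios = [dist(proj[i], proj[j]) / dist(points[i], points[j])
          for i in range(n) for j in range(i + 1, n)]
print(min(ratios), max(ratios))
```

Here the target dimension 200 is generous for 10 points; the lemma says $O(\log n / \varepsilon^2)$ dimensions suffice for distortion $\varepsilon$.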
The Johnson-Lindenstrauss Lemma Meets Compressed Sensing
We show how two fundamental results in analysis related to n-widths and Compressed Sensing are intimately related to the Johnson-Lindenstrauss lemma. Our elementary approach is based on the same concentration inequalities for random inner products that have recently provided simple proofs of the Johnson-Lindenstrauss lemma. We show how these ideas lead to simple proofs of Kashin’s theorems on w...
Geometric Optimization, April 12, 2007, Lecture 25: Johnson-Lindenstrauss Lemma
The topic of this lecture is dimensionality reduction. Many problems have been efficiently solved in low dimensions, but very often the solutions for low-dimensional spaces are impractical in high-dimensional spaces because either the space or the running time is exponential in the dimension. In order to address the curse of dimensionality, one technique is to map a set of points in a high dimensional space...
Lecture 6: Johnson-Lindenstrauss Lemma: Dimension Reduction
Observe that for any three points, if the three distances between them are given, then the three angles are fixed. Given $n-1$ vectors, the vectors together with the origin form a set of $n$ points. In fact, given any $n$ points in Euclidean space (in $n-1$ dimensions), the Johnson-Lindenstrauss Lemma states that the $n$ points can be placed in $O(\log n / \varepsilon^2)$ dimensions such that distances are preserved wi...